    An Event Structure Model for Probabilistic Concurrent Kleene Algebra

    We give a new true-concurrent model for probabilistic concurrent Kleene algebra. The model is based on probabilistic event structures, which combine ideas from Katoen's work on probabilistic concurrency and Varacca's probabilistic prime event structures. The event structures are compared with a true-concurrent version of Segala's probabilistic simulation. Finally, the algebraic properties of the model are summarised to the extent that they can be used to derive techniques such as probabilistic rely/guarantee inference rules.

    Comment: Submitted and accepted for LPAR19 (2013).
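
    As an aside not drawn from the paper, the non-probabilistic skeleton of such a model can be pictured as a prime event structure: a set of events with a causality relation and a symmetric conflict relation, whose runs ("configurations") are causally closed, conflict-free sets of events. The sketch below (all names hypothetical, probabilities omitted) checks that configuration property.

```python
# Hypothetical sketch (probabilities omitted): a prime event structure is a
# set of events with a causality relation and a symmetric conflict relation.
# Its runs ("configurations") are causally closed, conflict-free event sets.

from itertools import combinations

class EventStructure:
    def __init__(self, events, causality, conflict):
        self.events = set(events)
        self.causality = set(causality)  # (d, e): d must occur before e
        self.conflict = set(conflict)    # {d, e}: d and e exclude each other

    def is_configuration(self, config):
        config = set(config)
        # Causal closure: every cause of an event in config is in config too.
        closed = all(d in config for (d, e) in self.causality if e in config)
        # Conflict freeness: no two events in config are in conflict.
        free = all(frozenset(pair) not in self.conflict
                   for pair in combinations(config, 2))
        return closed and free

es = EventStructure(
    events={"a", "b", "c"},
    causality={("a", "b")},            # b causally depends on a
    conflict={frozenset({"b", "c"})},  # b and c are mutually exclusive
)
print(es.is_configuration({"a", "b"}))       # True
print(es.is_configuration({"b"}))            # False: its cause a is missing
print(es.is_configuration({"a", "b", "c"}))  # False: b conflicts with c
```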

    A proof of Burns' N-process mutual exclusion algorithm using abstraction

    Algebraic Principles for Rely-Guarantee Style Concurrency Verification Tools

    We provide simple equational principles for deriving rely-guarantee-style inference rules and refinement laws based on idempotent semirings. We link the algebraic layer with concrete models of programs based on languages and execution traces. We have implemented the approach in Isabelle/HOL as a lightweight concurrency verification tool that supports reasoning about the control and data flow of concurrent programs with shared variables at different levels of abstraction. This is illustrated on two simple verification examples.
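
    As a hedged illustration of the algebraic layer, languages (sets of words) under union and concatenation form an idempotent semiring, one of the concrete models mentioned above. The snippet below is a minimal sketch, not the paper's Isabelle/HOL development; it merely spot-checks the semiring axioms on small finite languages.

```python
# Minimal sketch (not the paper's Isabelle/HOL tool): languages over an
# alphabet, with union as + and concatenation as *, form an idempotent
# semiring. We spot-check the axioms on small finite languages.

def concat(X, Y):
    """Elementwise concatenation of two languages (sets of strings)."""
    return {x + y for x in X for y in Y}

A = {"a", "ab"}
B = {"", "b"}
C = {"c"}

zero = set()   # additive identity: the empty language
one = {""}     # multiplicative identity: the empty word

assert A | A == A                                        # idempotent addition
assert concat(A, B | C) == concat(A, B) | concat(A, C)   # left distributivity
assert concat(A | B, C) == concat(A, C) | concat(B, C)   # right distributivity
assert concat(A, one) == A and concat(one, A) == A       # unit laws
assert concat(A, zero) == zero                           # annihilation
print("semiring axioms hold on these examples")
```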

    Mathematical and computer modeling of electro-optic systems using a generic modeling approach

    The conventional approach to modelling electro-optic sensor systems is to develop separate models for individual systems or classes of system, depending on the detector technology employed in the sensor and on the application. However, this ignores the commonality in design and in components across these systems. A generic approach is presented for modelling a variety of sensor systems operating in the infrared waveband that also allows systems to be modelled with different levels of detail and at different stages of the product lifecycle. The provision of different model types (parametric and image-flow descriptions) within the generic framework allows valuable insights to be gained.
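
    A hypothetical sketch of the generic-framework idea (names and formulas invented, not taken from the paper): parametric and image-flow model types implement a common interface, so the same system can be modelled coarsely early in the lifecycle and in detail later without changing calling code.

```python
# Hypothetical illustration (names and formulas invented): parametric and
# image-flow model types share one interface, so a system can be modelled
# with different levels of detail at different lifecycle stages.

from abc import ABC, abstractmethod

class SensorModel(ABC):
    @abstractmethod
    def predict_snr(self, scene_irradiance: float) -> float:
        """Predicted signal-to-noise ratio for a given scene irradiance."""

class ParametricModel(SensorModel):
    """Coarse model driven by a handful of system-level parameters."""
    def __init__(self, responsivity: float, noise_floor: float):
        self.responsivity = responsivity
        self.noise_floor = noise_floor

    def predict_snr(self, scene_irradiance: float) -> float:
        return self.responsivity * scene_irradiance / self.noise_floor

class ImageFlowModel(SensorModel):
    """Detailed model propagating the signal through a chain of stages
    (in a full model each stage would transform an image)."""
    def __init__(self, stages):
        self.stages = stages  # callables applied in order

    def predict_snr(self, scene_irradiance: float) -> float:
        signal = scene_irradiance
        for stage in self.stages:
            signal = stage(signal)
        return signal

# Early in the lifecycle a parametric model suffices; later, a detailed
# image-flow chain can replace it behind the same interface.
coarse = ParametricModel(responsivity=0.8, noise_floor=0.01)
detailed = ImageFlowModel(stages=[lambda s: 0.9 * s, lambda s: s / 0.012])
print(coarse.predict_snr(0.5), detailed.predict_snr(0.5))
```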

    Formal verification techniques for model transformations: A tridimensional classification

    In Model Driven Engineering (MDE), models are first-class citizens, and model transformation is MDE's "heart and soul". Since model transformations are executed for a whole family of (conforming) models, their validity becomes a crucial issue. This paper explores the formal verification of model transformation properties along three dimensions: the transformation involved, the properties of interest addressed, and the formal verification techniques used to establish those properties. The work is intended for a double audience. For newcomers, it provides a tutorial introduction to the field of formal verification of model transformations. For readers more familiar with formal methods and model transformations, it offers a (non-systematic) literature review of contributions to the field. Overall, the work provides a better understanding of the evolution, trends and current practice in the domain of model transformation verification, and it opens a promising research line: an engineering discipline for model transformation verification guided by the notion of model transformation intent.
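
    As a hedged illustration, the survey's three classification axes can be pictured as a simple record type; the field names and example values below are invented, not taken from the paper.

```python
# Sketch of the three classification axes as a record type.
# Example values are illustrative, not drawn from the survey.

from dataclasses import dataclass

@dataclass
class VerificationEntry:
    transformation: str   # the transformation involved (kind, language)
    properties: list      # properties of interest (e.g. termination)
    techniques: list      # formal techniques used to establish them

entry = VerificationEntry(
    transformation="ATL model-to-model transformation",
    properties=["termination", "syntactic correctness"],
    techniques=["model checking"],
)
print(entry)
```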

    Public understandings of addiction: where do neurobiological explanations fit?

    Developments in the field of neuroscience, according to its proponents, offer the prospect of an enhanced understanding and treatment of addicted persons. Consequently, its advocates consider that improving public understanding of addiction neuroscience is a desirable aim. Those critical of neuroscientific approaches, however, charge that it is a totalising, reductive perspective, one that ignores other known causes in favour of neurobiological explanations. Sociologist Nikolas Rose has argued that neuroscience, and its associated technologies, are coming to dominate cultural models to the extent that 'we' increasingly understand ourselves as 'neurochemical selves'. Drawing on 55 qualitative interviews conducted with members of the Australian public residing in the Greater Brisbane area, we challenge both the 'expectational discourses' of neuroscientists and the criticisms of its detractors. Members of the public accepted multiple perspectives on the causes of addiction, including some elements of neurobiological explanations. Their discussions of addiction drew upon a broad range of philosophical, sociological, anthropological, psychological and neurobiological vocabularies, suggesting that they synthesised newer technical understandings, such as those offered by neuroscience, with older ones. That the public holds conceptual models which acknowledge the complexity of addiction aetiology, and into which new information is incorporated, suggests that the impact of neuroscientific discourse in directing the public's beliefs about addiction is likely to be more limited than either proponents or opponents of neuroscience expect.

    Local conservation scores without a priori assumptions on neutral substitution rates

    Background: Comparative genomics aims to detect signals of evolutionary conservation as an indicator of functional constraint. Surprisingly, results of the ENCODE project revealed that about half of the experimentally verified functional elements found in non-coding DNA were classified as unconstrained by computational predictions. Following this observation, it has been hypothesized that this may be partly explained by biased estimates of the neutral evolutionary rates used by existing sequence conservation metrics. All methods we are aware of rely on a comparison with the neutral rate, estimating conservation by measuring the deviation of a particular genomic region from this rate. Consequently, it is reasonable to assume that inaccurate neutral rate estimates may lead to biased conservation and constraint estimates.

    Results: We propose a conservation signal produced by local maximum likelihood estimation of evolutionary parameters using an optimized sliding window, and we present a Kullback-Leibler projection that transforms multiple estimated parameters into a single conservation measure. This measure does not rely on assumptions about neutral evolutionary substitution rates, and few a priori assumptions are imposed on the properties of the conserved regions. We show the accuracy of our approach (KuLCons) on synthetic data and compare it to the scores generated by state-of-the-art methods (phastCons, GERP, SCONE) in an ENCODE region. We find that KuLCons is most often in agreement with the conservation/constraint signatures detected by GERP and SCONE, while qualitatively very different patterns are observed from phastCons. In contrast to standard methods, KuLCons can be extended to more complex evolutionary models, e.g. taking insertion and deletion events into account; scores obtained under this extended model can diverge significantly from those obtained under the simpler one.

    Conclusion: Our results suggest that discriminating among different degrees of conservation is possible without making assumptions about neutral rates. We find, however, that KuLCons cannot be expected to discover substantially different constraint regions than GERP and SCONE. Consequently, we conclude that the reported discrepancies between experimentally verified functional elements and computationally identified constraint elements are unlikely to be explained by biased neutral rate estimates.
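
    A hedged sketch of the core idea (not the KuLCons implementation): estimate an empirical distribution over alignment-column states in a sliding window and score each window by its Kullback-Leibler divergence from a reference distribution. All names and the toy data below are invented.

```python
# Illustrative sketch, not the KuLCons code: score each sliding window by
# the Kullback-Leibler divergence of its empirical state distribution from
# a reference distribution. Higher score = stronger deviation.

import math

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions given as dicts."""
    return sum(p[k] * math.log(p[k] / q[k]) for k in p if p[k] > 0)

def window_distribution(all_states, window):
    """Empirical state distribution in one window, with add-one smoothing."""
    counts = {s: 1 for s in sorted(set(all_states))}  # pseudo-counts
    for s in window:
        counts[s] += 1
    total = sum(counts.values())
    return {s: c / total for s, c in counts.items()}

# Toy alignment columns: 'M' = match across species, 'S' = substitution.
columns = "MMMMMSMMMMSSSSMMMM"
reference = {"M": 0.5, "S": 0.5}  # assumed reference distribution

w = 5  # window size
scores = []
for i in range(len(columns) - w + 1):
    p = window_distribution(columns, columns[i:i + w])
    scores.append(round(kl_divergence(p, reference), 3))
print(scores)  # conserved (all-'M') windows score highest here
```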

    Speckle-free laser imaging

    Many imaging applications require increasingly bright illumination sources, motivating the replacement of conventional thermal light sources with light-emitting diodes (LEDs), superluminescent diodes (SLDs) and lasers. Despite their brightness, lasers and SLDs are poorly suited to full-field imaging applications because their high spatial coherence leads to coherent artifacts, known as speckle, that corrupt image formation. We recently demonstrated that random lasers can be engineered to provide low spatial coherence. Here, we exploit the low spatial coherence of specifically designed random lasers to perform speckle-free full-field imaging in the presence of significant optical scattering. We quantitatively demonstrate that images generated with random laser illumination exhibit higher resolution than images generated with spatially coherent illumination. By providing intense laser illumination without the drawback of coherent artifacts, random lasers are well suited to a host of full-field imaging applications, from full-field microscopy to digital light projector systems.

    Comment: 5 pages, 4 figures.
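
    As an illustration not taken from the paper, speckle is commonly quantified by the speckle contrast C = std(I)/mean(I) of the intensity image: fully developed speckle under coherent illumination gives C near 1, while averaging N independent speckle patterns, as a low-coherence many-mode source effectively does, reduces it toward 1/sqrt(N). A minimal simulation:

```python
# Illustrative (not from the paper): speckle contrast C = std(I) / mean(I).
# Coherent illumination gives fully developed speckle with C ~ 1; summing N
# independent speckle patterns mimics low spatial coherence, C ~ 1/sqrt(N).

import numpy as np

def speckle_contrast(intensity: np.ndarray) -> float:
    intensity = np.asarray(intensity, dtype=float)
    return float(intensity.std() / intensity.mean())

rng = np.random.default_rng(0)

# Fully developed speckle: exponentially distributed intensity.
coherent = rng.exponential(scale=1.0, size=(256, 256))

# Sum of N independent speckle patterns, as from N uncorrelated modes.
n_modes = 100
low_coherence = sum(rng.exponential(1.0, (256, 256)) for _ in range(n_modes))

print(f"coherent C = {speckle_contrast(coherent):.2f}")            # ~ 1.0
print(f"low-coherence C = {speckle_contrast(low_coherence):.2f}")  # ~ 0.1
```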